
🎉 New Source: Notion #7092

Merged: 16 commits into airbytehq:master on Nov 18, 2021

Conversation

@burmecia (Contributor) commented Oct 16, 2021:

What

How

Add four streams for this source (a rough sketch of one stream follows the list):

  1. Users
  2. Databases
  3. Pages
  4. Blocks
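
For illustration only, a rough sketch (not the code in this PR) of what one of these streams can look like with the Airbyte CDK's HttpStream. The token argument and the Notion-Version value are assumptions:

from typing import Any, Iterable, Mapping, MutableMapping, Optional

import requests
from airbyte_cdk.sources.streams.http import HttpStream


class Users(HttpStream):
    url_base = "https://api.notion.com/v1/"
    primary_key = "id"

    def __init__(self, token: str, **kwargs):
        super().__init__(**kwargs)
        self._token = token

    def path(self, **kwargs) -> str:
        return "users"

    def request_headers(self, **kwargs) -> Mapping[str, Any]:
        # Notion expects a bearer token plus a versioned API header
        return {"Authorization": f"Bearer {self._token}", "Notion-Version": "2021-08-16"}

    def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
        # Notion paginates with an opaque cursor; stop when has_more is false
        data = response.json()
        return {"start_cursor": data["next_cursor"]} if data.get("has_more") else None

    def request_params(self, next_page_token: Optional[Mapping[str, Any]] = None, **kwargs) -> MutableMapping[str, Any]:
        return dict(next_page_token or {})

    def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
        yield from response.json().get("results", [])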

Recommended reading order

  1. bootstrap.md
  2. spec.json
  3. Schemas under schemas/*.json
  4. source.py
  5. streams.py

Test Runs

Unit test: [screenshot: unit-test]

Integration test: [screenshot: int-test]

Pre-merge Checklist

Expand the relevant checklist and delete the others.

New Connector

Community member or Airbyter

  • Community member? Grant edit access to maintainers (instructions)
  • Secrets in the connector's spec are annotated with airbyte_secret
  • Unit & integration tests added and passing. Community members, please provide proof of success locally, e.g. a screenshot or copy-pasted unit, integration, and acceptance test output. To run acceptance tests for a Python connector, follow the instructions in the README. For Java connectors, run ./gradlew :airbyte-integrations:connectors:<name>:integrationTest.
  • Code reviews completed
  • Documentation updated
    • Connector's README.md
    • Connector's bootstrap.md. See description and examples
    • docs/SUMMARY.md
    • docs/integrations/<source or destination>/<name>.md including changelog. See changelog example
    • docs/integrations/README.md
    • airbyte-integrations/builds.md
  • PR name follows PR naming conventions
  • Connector added to connector index like described here

Airbyter

If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.

  • Create a non-forked branch based on this PR and test the below items on it
  • Build is successful
  • Credentials added to Github CI. Instructions.
  • /test connector=connectors/<name> command is passing.
  • New Connector version released on Dockerhub by running the /publish command described here

Updating a connector

Community member or Airbyter

  • Grant edit access to maintainers (instructions)
  • Secrets in the connector's spec are annotated with airbyte_secret
  • Unit & integration tests added and passing. Community members, please provide proof of success locally, e.g. a screenshot or copy-pasted unit, integration, and acceptance test output. To run acceptance tests for a Python connector, follow the instructions in the README. For Java connectors, run ./gradlew :airbyte-integrations:connectors:<name>:integrationTest.
  • Code reviews completed
  • Documentation updated
    • Connector's README.md
    • Connector's bootstrap.md. See description and examples
    • Changelog updated in docs/integrations/<source or destination>/<name>.md including changelog. See changelog example
  • PR name follows PR naming conventions
  • Connector version bumped like described here

Airbyter

If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.

  • Create a non-forked branch based on this PR and test the below items on it
  • Build is successful
  • Credentials added to Github CI. Instructions.
  • /test connector=connectors/<name> command is passing.
  • New Connector version released on Dockerhub by running the /publish command described here

Connector Generator

  • Issue acceptance criteria met
  • PR name follows PR naming conventions
  • If adding a new generator, add it to the list of scaffold modules being tested
  • The generator test modules (all connectors with -scaffold in their name) have been updated with the latest scaffold by running ./gradlew :airbyte-integrations:connector-templates:generator:testScaffoldTemplates then checking in your changes
  • Documentation which references the generator is updated as needed.

The github-actions bot added the labels area/connectors (Connector related issues) and area/documentation (Improvements or additions to documentation) on Oct 16, 2021
@burmecia changed the title from "add source notion" to "🎉 New Source: Notion" on Oct 16, 2021
@marcosmarxm (Member) commented:

Sorry for the delay, @burmecia! I added this to our sprint to be reviewed. Is it possible to share the integration credentials for testing? (You can send them to me on Slack.)
Please make sure to run ./gradlew format to apply the default formatting.

@burmecia (Contributor, Author) commented:

Thank you @marcosmarxm, I've sent them to you on Slack.

Comment on lines 21 to 23
"has_children": {
"type": "boolean"
},
Member:

In most cases the common pattern is ["null", "boolean"]; only use type: boolean if you are sure this field won't have null values.
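
Applied to the snippet above, the suggested pattern would look like this (illustration only, not the committed schema):

"has_children": {
  "type": ["null", "boolean"]
},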

Contributor Author:

done

@marcosmarxm (Member) left a comment:

=========================== short test summary info ============================
FAILED test_core.py::TestBasicRead::test_read[inputs0] - jsonref.JsonRefError...
FAILED test_full_refresh.py::TestFullRefresh::test_sequential_reads[inputs0]
=================== 2 failed, 12 passed in 91.99s (0:01:31) ====================

Unit tests are passing, but integration tests are failing. I tested with both your credentials and ours. Let me know if you need help debugging.

@burmecia (Contributor, Author) commented:

> =========================== short test summary info ============================
> FAILED test_core.py::TestBasicRead::test_read[inputs0] - jsonref.JsonRefError...
> FAILED test_full_refresh.py::TestFullRefresh::test_sequential_reads[inputs0]
> =================== 2 failed, 12 passed in 91.99s (0:01:31) ====================
>
> Unit tests are passing, but integration tests are failing. I tested with both your credentials and ours. Let me know if you need help debugging.

I just tested on my side and it is working. Can you confirm you're running the latest code and using the test command below?

python -m pytest -p integration_tests.acceptance

And if possible, can you send me your acceptance_tests_logs folder? Thanks.

Comment on lines +1 to +4
{
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "user.json"
}
Member:

@sherifnada is this supported by the CDK? Running the Python acceptance tests passes, but when I run ./gradlew airbyte-integrations:connectors:source-notion:integrationTest the test fails. It only starts working when I use the full schema (copied from user.json).
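
For background, a minimal illustrative sketch of the ref resolution involved (not the CDK's or this PR's code; the schema path is hypothetical): jsonref can only resolve a relative "$ref" such as "user.json" when it is given a base URI to resolve against; otherwise resolution fails with a JsonRefError like the one in the logs below.

import json
import jsonref

# Hypothetical location of the connector's schema folder, for illustration only
base_uri = "file:///airbyte-integrations/connectors/source-notion/source_notion/schemas/"

with open("users.json") as fp:  # contains {"$ref": "user.json"}
    schema = json.load(fp)

# With base_uri, jsonref loads user.json and inlines its contents; without it,
# accessing the result raises jsonref.JsonRefError because the relative ref
# cannot be fetched.
resolved = jsonref.JsonRef.replace_refs(schema, base_uri=base_uri, jsonschema=True)
print(resolved)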

Contributor:

@marcosmarxm yes it's supported. Can you post the failure logs?

@marcosmarxm (Member) left a comment:

Small changes requested; I'm only having some problems with the users stream, but soon we can merge this.

@marcosmarxm (Member) commented:

@burmecia and @sherifnada, sorry for the delay in sending the logs.

./gradlew airbyte-integrations:connectors:source-notion:integrationTest

> Task :airbyte-integrations:connectors:source-notion:sourceAcceptanceTest
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /test_input
plugins: sugar-0.9.4, timeout-1.4.2
collected 14 items

test_core.py ..........F                                                 [ 78%]
test_full_refresh.py F                                                   [ 85%]
test_incremental.py ..                                                   [100%]

=================================== FAILURES ===================================
_______________________ TestBasicRead.test_read[inputs0] _______________________

self = <[<JsonRefError: "ValueError: unknown url type: ''"> raised in repr()] JsonRef object at 0x7f8bb20789d0>

    @property
    def __subject__(self):
        try:
>           return self.cache

/usr/local/lib/python3.7/site-packages/proxytypes.py:252: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <[<JsonRefError: "ValueError: unknown url type: ''"> raised in repr()] JsonRef object at 0x7f8bb20789d0>
attr = 'cache'

    def __getattribute__(self, attr):
        if Proxy._should_proxy(self, attr):
            return getattr(self.__subject__, attr)
>       return _oga(self, attr)
E       AttributeError: 'JsonRef' object has no attribute 'cache'

/usr/local/lib/python3.7/site-packages/proxytypes.py:176: AttributeError

During handling of the above exception, another exception occurred:

self = <[<JsonRefError: "ValueError: unknown url type: ''"> raised in repr()] JsonRef object at 0x7f8bb20789d0>

    def callback(self):
        uri, fragment = urlparse.urldefrag(self.full_uri)
    
        # If we already looked this up, return a reference to the same object
        if uri in self.store:
            result = self.resolve_pointer(self.store[uri], fragment)
        else:
            # Remote ref
            try:
>               base_doc = self.loader(uri)

/usr/local/lib/python3.7/site-packages/jsonref.py:178: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <jsonref.JsonLoader object at 0x7f8bb2df6a10>, uri = '', kwargs = {}

    def __call__(self, uri, **kwargs):
        """
        Return the loaded JSON referred to by `uri`
    
        :param uri: The URI of the JSON document to load
        :param kwargs: Keyword arguments passed to :func:`json.loads`
    
        """
        if uri in self.store:
            return self.store[uri]
        else:
>           result = self.get_remote_json(uri, **kwargs)

/usr/local/lib/python3.7/site-packages/jsonref.py:299: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <jsonref.JsonLoader object at 0x7f8bb2df6a10>, uri = '', kwargs = {}
scheme = ''

    def get_remote_json(self, uri, **kwargs):
        scheme = urlparse.urlsplit(uri).scheme
    
        if scheme in ["http", "https"] and requests:
            # Prefer requests, it has better encoding detection
            try:
                result = requests.get(uri).json(**kwargs)
            except TypeError:
                warnings.warn("requests >=1.2 required for custom kwargs to json.loads")
                result = requests.get(uri).json()
        else:
            # Otherwise, pass off to urllib and assume utf-8
>           result = json.loads(urlopen(uri).read().decode("utf-8"), **kwargs)

/usr/local/lib/python3.7/site-packages/jsonref.py:316: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url = '', data = None, timeout = <object object at 0x7f8bb4a83280>

    def urlopen(url, data=None, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
                *, cafile=None, capath=None, cadefault=False, context=None):
        '''Open the URL url, which can be either a string or a Request object.
    
        *data* must be an object specifying additional data to be sent to
        the server, or None if no such data is needed.  See Request for
        details.
    
        urllib.request module uses HTTP/1.1 and includes a "Connection:close"
        header in its HTTP requests.
    
        The optional *timeout* parameter specifies a timeout in seconds for
        blocking operations like the connection attempt (if not specified, the
        global default timeout setting will be used). This only works for HTTP,
        HTTPS and FTP connections.
    
        If *context* is specified, it must be a ssl.SSLContext instance describing
        the various SSL options. See HTTPSConnection for more details.
    
        The optional *cafile* and *capath* parameters specify a set of trusted CA
        certificates for HTTPS requests. cafile should point to a single file
        containing a bundle of CA certificates, whereas capath should point to a
        directory of hashed certificate files. More information can be found in
        ssl.SSLContext.load_verify_locations().
    
        The *cadefault* parameter is ignored.
    
        This function always returns an object which can work as a context
        manager and has methods such as
    
        * geturl() - return the URL of the resource retrieved, commonly used to
          determine if a redirect was followed
    
        * info() - return the meta-information of the page, such as headers, in the
          form of an email.message_from_string() instance (see Quick Reference to
          HTTP Headers)
    
        * getcode() - return the HTTP status code of the response.  Raises URLError
          on errors.
    
        For HTTP and HTTPS URLs, this function returns a http.client.HTTPResponse
        object slightly modified. In addition to the three new methods above, the
        msg attribute contains the same information as the reason attribute ---
        the reason phrase returned by the server --- instead of the response
        headers as it is specified in the documentation for HTTPResponse.
    
        For FTP, file, and data URLs and requests explicitly handled by legacy
        URLopener and FancyURLopener classes, this function returns a
        urllib.response.addinfourl object.
    
        Note that None may be returned if no handler handles the request (though
        the default installed global OpenerDirector uses UnknownHandler to ensure
        this never happens).
    
        In addition, if proxy settings are detected (for example, when a *_proxy
        environment variable like http_proxy is set), ProxyHandler is default
        installed and makes sure the requests are handled through the proxy.
    
        '''
        global _opener
        if cafile or capath or cadefault:
            import warnings
            warnings.warn("cafile, capath and cadefault are deprecated, use a "
                          "custom context instead.", DeprecationWarning, 2)
            if context is not None:
                raise ValueError(
                    "You can't pass both context and any of cafile, capath, and "
                    "cadefault"
                )
            if not _have_ssl:
                raise ValueError('SSL support not available')
            context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH,
                                                 cafile=cafile,
                                                 capath=capath)
            https_handler = HTTPSHandler(context=context)
            opener = build_opener(https_handler)
        elif context:
            https_handler = HTTPSHandler(context=context)
            opener = build_opener(https_handler)
        elif _opener is None:
            _opener = opener = build_opener()
        else:
            opener = _opener
>       return opener.open(url, data, timeout)

/usr/local/lib/python3.7/urllib/request.py:222: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib.request.OpenerDirector object at 0x7f8bb2238b10>, fullurl = ''
data = None, timeout = <object object at 0x7f8bb4a83280>

    def open(self, fullurl, data=None, timeout=socket._GLOBAL_DEFAULT_TIMEOUT):
        # accept a URL or a Request object
        if isinstance(fullurl, str):
>           req = Request(fullurl, data)

/usr/local/lib/python3.7/urllib/request.py:510: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib.request.Request object at 0x7f8bb2e0bf10>, url = '', data = None
headers = {}, origin_req_host = None, unverifiable = False, method = None

    def __init__(self, url, data=None, headers={},
                 origin_req_host=None, unverifiable=False,
                 method=None):
>       self.full_url = url

/usr/local/lib/python3.7/urllib/request.py:328: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib.request.Request object at 0x7f8bb2e0bf10>, url = ''

    @full_url.setter
    def full_url(self, url):
        # unwrap('<URL:type://host/path>') --> 'type://host/path'
        self._full_url = unwrap(url)
        self._full_url, self.fragment = splittag(self._full_url)
>       self._parse()

/usr/local/lib/python3.7/urllib/request.py:354: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib.request.Request object at 0x7f8bb2e0bf10>

    def _parse(self):
        self.type, rest = splittype(self._full_url)
        if self.type is None:
>           raise ValueError("unknown url type: %r" % self.full_url)
E           ValueError: unknown url type: ''

/usr/local/lib/python3.7/urllib/request.py:383: ValueError

The above exception was the direct cause of the following exception:

self = <source_acceptance_test.tests.test_core.TestBasicRead object at 0x7f8bb1eea150>
connector_config = SecretDict(******)
configured_catalog = ConfiguredAirbyteCatalog(streams=[ConfiguredAirbyteStream(stream=AirbyteStream(name='users', json_schema={'$ref': '#/d...l: 'incremental'>, cursor_field=None, destination_sync_mode=<DestinationSyncMode.append: 'append'>, primary_key=None)])
inputs = BasicReadTestConfig(config_path='secrets/config.json', configured_catalog_path='integration_tests/configured_catalog.json', empty_streams=set(), expect_records=None, validate_schema=True, timeout_seconds=None)
expected_records = []
docker_runner = <source_acceptance_test.utils.connector_runner.ConnectorRunner object at 0x7f8bb2247810>
detailed_logger = <Logger detailed_logger /test_input/acceptance_tests_logs/test_core.py__TestBasicRead__test_read[inputs0].txt (DEBUG)>

    def test_read(
        self,
        connector_config,
        configured_catalog,
        inputs: BasicReadTestConfig,
        expected_records: List[AirbyteMessage],
        docker_runner: ConnectorRunner,
        detailed_logger,
    ):
        output = docker_runner.call_read(connector_config, configured_catalog)
        records = [message.record for message in filter_output(output, Type.RECORD)]
    
        assert records, "At least one record should be read using provided catalog"
    
        if inputs.validate_schema:
>           self._validate_schema(records=records, configured_catalog=configured_catalog)

/usr/local/lib/python3.7/site-packages/source_acceptance_test/tests/test_core.py:265: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/local/lib/python3.7/site-packages/source_acceptance_test/tests/test_core.py:201: in _validate_schema
    TestBasicRead._validate_records_structure(records, configured_catalog)
/usr/local/lib/python3.7/site-packages/source_acceptance_test/tests/test_core.py:186: in _validate_records_structure
    schemas[stream.stream.name] = set(get_expected_schema_structure(stream.stream.json_schema))
/usr/local/lib/python3.7/site-packages/source_acceptance_test/utils/json_schema_helper.py:195: in get_expected_schema_structure
    _scan_schema(schema)
/usr/local/lib/python3.7/site-packages/source_acceptance_test/utils/json_schema_helper.py:171: in _scan_schema
    if "oneOf" in subschema or "anyOf" in subschema:
/usr/local/lib/python3.7/site-packages/proxytypes.py:202: in proxied
    args.insert(arg_pos, self.__subject__)
/usr/local/lib/python3.7/site-packages/proxytypes.py:176: in __getattribute__
    return _oga(self, attr)
/usr/local/lib/python3.7/site-packages/proxytypes.py:134: in wrapper
    return method(self, *args, **kwargs)
/usr/local/lib/python3.7/site-packages/proxytypes.py:254: in __subject__
    self.cache = super(LazyProxy, self).__subject__
/usr/local/lib/python3.7/site-packages/proxytypes.py:134: in wrapper
    return method(self, *args, **kwargs)
/usr/local/lib/python3.7/site-packages/proxytypes.py:240: in __subject__
    return self.callback()
/usr/local/lib/python3.7/site-packages/proxytypes.py:134: in wrapper
    return method(self, *args, **kwargs)
/usr/local/lib/python3.7/site-packages/jsonref.py:180: in callback
    self._error("%s: %s" % (e.__class__.__name__, unicode(e)), cause=e)
/usr/local/lib/python3.7/site-packages/proxytypes.py:134: in wrapper
    return method(self, *args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <[<JsonRefError: "ValueError: unknown url type: ''"> raised in repr()] JsonRef object at 0x7f8bb20789d0>
message = "ValueError: unknown url type: ''"
cause = ValueError("unknown url type: ''")

    def _error(self, message, cause=None):
        raise JsonRefError(
            message,
            self.__reference__,
            uri=self.full_uri,
            base_uri=self.base_uri,
            path=self.path,
>           cause=cause,
        )
E       jsonref.JsonRefError: ValueError: unknown url type: ''

/usr/local/lib/python3.7/site-packages/jsonref.py:229: JsonRefError
________________ TestFullRefresh.test_sequential_reads[inputs0] ________________

self = <source_acceptance_test.tests.test_full_refresh.TestFullRefresh object at 0x7f8bb1f3e310>
connector_config = SecretDict(******)
configured_catalog = ConfiguredAirbyteCatalog(streams=[ConfiguredAirbyteStream(stream=AirbyteStream(name='users', json_schema={'$ref': '#/d...: 'full_refresh'>, cursor_field=None, destination_sync_mode=<DestinationSyncMode.append: 'append'>, primary_key=None)])
docker_runner = <source_acceptance_test.utils.connector_runner.ConnectorRunner object at 0x7f8bb1f3e790>
detailed_logger = <Logger detailed_logger /test_input/acceptance_tests_logs/test_full_refresh.py__TestFullRefresh__test_sequential_reads[inputs0].txt (DEBUG)>

    def test_sequential_reads(self, connector_config, configured_catalog, docker_runner: ConnectorRunner, detailed_logger):
        configured_catalog = full_refresh_only_catalog(configured_catalog)
        output = docker_runner.call_read(connector_config, configured_catalog)
        records_1 = [message.record.data for message in output if message.type == Type.RECORD]
    
        output = docker_runner.call_read(connector_config, configured_catalog)
        records_2 = [message.record.data for message in output if message.type == Type.RECORD]
    
        output_diff = set(map(serialize, records_1)) - set(map(serialize, records_2))
        if output_diff:
            msg = "The two sequential reads should produce either equal set of records or one of them is a strict subset of the other"
            detailed_logger.info(msg)
            detailed_logger.log_json_list(output_diff)
>           pytest.fail(msg)
E           Failed: The two sequential reads should produce either equal set of records or one of them is a strict subset of the other

/usr/local/lib/python3.7/site-packages/source_acceptance_test/tests/test_full_refresh.py:27: Failed
------------------------------ Captured log call -------------------------------
INFO     detailed_logger /test_input/acceptance_tests_logs/test_full_refresh.py__TestFullRefresh__test_sequential_reads[inputs0].txt:test_full_refresh.py:25 The two sequential reads should produce either equal set of records or one of them is a strict subset of the other
INFO     detailed_logger /test_input/acceptance_tests_logs/test_full_refresh.py__TestFullRefresh__test_sequential_reads[inputs0].txt:conftest.py:164 [
 {
  "object": "page",
  "id": "c09a65d7-d520-4ac3-931b-ec1be4f77139",
  "created_time": "2021-10-19T15:13:00.000Z",
  "last_edited_time": "2021-10-19T15:19:00.000Z",
  "cover": null,
  "icon": {
   "type": "file",
   "file": {
    "url": "https://s3.us-west-2.amazonaws.com/secure.notion-static.com/54fb2c2a-fcbc-4386-bd78-469dccc60430/Screenshot_2021-10-19_at_16.42.29.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAT73L2G45O3KS52Y5%2F20211026%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20211026T033819Z&X-Amz-Expires=3600&X-Amz-Signature=50a85a563ed1361f1ec4b98e1112166baddf0a26f03f30778f520676d267f2d9&X-Amz-SignedHeaders=host",
    "expiry_time": "2021-10-26T04:38:19.473Z"
   }
  },
  "parent": {
   "type": "page_id",
   "page_id": "30597108-b046-4c09-a42b-cb78f6cc3972"
  },
  "archived": false,
  "properties": {
   "title": {
    "id": "title",
    "type": "title",
    "title": [
     {
      "type": "text",
      "text": {
       "content": "Airbyte",
       "link": null
      },
      "annotations": {
       "bold": false,
       "italic": false,
       "strikethrough": false,
       "underline": false,
       "code": false,
       "color": "default"
      },
      "plain_text": "Airbyte",
      "href": null
     }
    ]
   }
  },
  "url": "https://www.notion.so/Airbyte-c09a65d7d5204ac3931bec1be4f77139"
 }
]
=========================== short test summary info ============================
FAILED test_core.py::TestBasicRead::test_read[inputs0] - jsonref.JsonRefError...
FAILED test_full_refresh.py::TestFullRefresh::test_sequential_reads[inputs0]
=================== 2 failed, 12 passed in 91.17s (0:01:31) ====================

> Task :airbyte-integrations:connectors:source-notion:sourceAcceptanceTest FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':airbyte-integrations:connectors:source-notion:sourceAcceptanceTest'.
> Process 'command 'docker'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2m 25s

test_full_refresh.py__TestFullRefresh__test_sequential_reads[inputs0] log file:

The two sequential reads should produce either equal set of records or one of them is a strict subset of the other
[
 {
  "object": "page",
  "id": "c09a65d7-d520-4ac3-931b-ec1be4f77139",
  "created_time": "2021-10-19T15:13:00.000Z",
  "last_edited_time": "2021-10-19T15:19:00.000Z",
  "cover": null,
  "icon": {
   "type": "file",
   "file": {
    "url": "https://s3.us-west-2.amazonaws.com/secure.notion-static.com/54fb2c2a-fcbc-4386-bd78-469dccc60430/Screenshot_2021-10-19_at_16.42.29.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAT73L2G45O3KS52Y5%2F20211026%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20211026T034239Z&X-Amz-Expires=3600&X-Amz-Signature=04a9b9b47069d9e52330dbd61629ba87b09c37fea4583792745a3387c0efae5c&X-Amz-SignedHeaders=host",
    "expiry_time": "2021-10-26T04:42:39.896Z"
   }
  },
  "parent": {
   "type": "page_id",
   "page_id": "30597108-b046-4c09-a42b-cb78f6cc3972"
  },
  "archived": false,
  "properties": {
   "title": {
    "id": "title",
    "type": "title",
    "title": [
     {
      "type": "text",
      "text": {
       "content": "Airbyte",
       "link": null
      },
      "annotations": {
       "bold": false,
       "italic": false,
       "strikethrough": false,
       "underline": false,
       "code": false,
       "color": "default"
      },
      "plain_text": "Airbyte",
      "href": null
     }
    ]
   }
  },
  "url": "https://www.notion.so/Airbyte-c09a65d7d5204ac3931bec1be4f77139"
 }
]

@burmecia (Contributor, Author) commented Nov 4, 2021:

Hi @marcosmarxm, I tested using the same command as yours, and it also succeeds on my laptop. Could it be an environment issue on your side?

The command I used:

./gradlew airbyte-integrations:connectors:source-notion:integrationTest

It generates quite lengthy output, but the final result is a success.

[Screenshot: successful test run, Screen Shot 2021-11-04 at 11 29 59 pm]

@marcosmarxm self-assigned this on Nov 8, 2021
@marcosmarxm (Member) commented:

Hello @burmecia, thanks! I got one problem syncing a private workspace. Do you have one in your integration?

@burmecia (Contributor, Author) commented Nov 8, 2021:

No, I didn't see that problem. Do you have an error log?

@marcosmarxm (Member) commented:

Hello! Sorry for not finishing the review of your contribution before the date stipulated in the contest. All contributions made before November 15 are eligible to receive the award. We're trying to review your contribution as soon as possible.

@marcosmarxm (Member) left a comment:

Thanks @burmecia, and sorry for the delay in reviewing this amazing contribution!

@marcosmarxm temporarily deployed to more-secrets on November 18, 2021 03:19
@marcosmarxm temporarily deployed to more-secrets on November 18, 2021 04:04
@marcosmarxm temporarily deployed to more-secrets on November 18, 2021 04:06
@marcosmarxm merged commit 595ed6b into airbytehq:master on Nov 18, 2021
schlattk pushed a commit to schlattk/airbyte that referenced this pull request Jan 4, 2022
* add source notion

* update PR number in change log

* bug fix and code improvement as code review suggestions

* code improvement as review advices

* new connector notion

* format

* correct creds file

* run seed

* bump connector version

* format

Co-authored-by: Marcos Marx <marcosmarxm@gmail.com>